Using Global Line Searches for Finding Global Minima of MLP Error Functions

Author

  • C. Pellegrini
Abstract

In this paper we consider a possible improvement of the conjugate gradient methods commonly used for training multilayer perceptrons. These methods rely on a line search procedure that is partially responsible for the fact that only local minima are found. We propose to perform line searches along very long lines in order to find global minima; since the usual line search procedures are not suitable in this context, we propose to use powerful algorithms from the field of global optimization, in particular Žilinskas's P algorithm. We show that P cannot replace the usual line search directly, but can be coupled with a conjugate gradient algorithm in a different way. The new algorithm we propose outperforms several related algorithms.
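The abstract gives no implementation details, but the P algorithm it refers to is Žilinskas's Bayesian one-dimensional global optimizer, which models the objective along the line as a Wiener process and evaluates next where the probability of improving on the incumbent is highest. The Python sketch below is only an illustration of that idea under a Wiener-process prior with scale sigma; the function names, the improvement margin eps, and the simple driver are our own assumptions, not the paper's actual construction.

```python
import numpy as np

def p_algorithm_step(xs, ys, eps=1e-3, sigma=1.0):
    """One step of a P-algorithm under a Wiener-process prior (illustrative).

    xs, ys -- sorted sample points on the line and their objective values.
    Picks the point maximizing P(f(x) <= min(ys) - eps) over all intervals.
    """
    y_star = ys.min() - eps  # target: beat the incumbent by at least eps
    best_x, best_z = None, -np.inf
    for a, b, ya, yb in zip(xs[:-1], xs[1:], ys[:-1], ys[1:]):
        # Closed-form maximizer of the improvement probability inside (a, b).
        t = (ya - y_star) / (ya + yb - 2.0 * y_star)    # always in (0, 1)
        mean = ya + t * (yb - ya)                       # conditional mean
        sd = sigma * np.sqrt((b - a) * t * (1.0 - t))   # conditional std dev
        z = (y_star - mean) / sd     # Phi(z) = improvement probability
        if z > best_z:
            best_x, best_z = a + t * (b - a), z
    return best_x

def global_line_search(phi, t_max, n_evals=20, eps=1e-3):
    """Globally minimize phi on [0, t_max] using the step above."""
    ts = np.array([0.0, t_max])
    ys = np.array([phi(0.0), phi(t_max)])
    for _ in range(n_evals - 2):
        t_new = p_algorithm_step(ts, ys, eps=eps)
        i = np.searchsorted(ts, t_new)
        ts = np.insert(ts, i, t_new)
        ys = np.insert(ys, i, phi(t_new))
    k = int(np.argmin(ys))
    return ts[k], ys[k]
```

In the MLP setting, phi(t) would be the network error along a very long segment w + t*d for a search direction d. The abstract's point is precisely that such a procedure cannot simply replace the exact line search inside conjugate gradient, so the paper's actual coupling is looser than this sketch suggests.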

Similar Articles

Another Hybrid Algorithm for Finding a Global Minimum of MLP Error Functions

This report presents P_scg, a new global optimization method for training multilayered perceptrons. Instead of local minima, global minima of the error function are found. The new method is hybrid in the sense that it combines three very different optimization techniques: random line search, scaled conjugate gradient, and a one-dimensional minimization algorithm named P. The best points of each ...
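The abstract is cut off before it says how the three techniques interact. Purely as an illustration, one plausible interleaving (reusing global_line_search from the sketch above, with scipy's conjugate gradient standing in for the scaled conjugate gradient the report actually uses) might look like:

```python
import numpy as np
from scipy.optimize import minimize

def hybrid_train(loss, w0, n_rounds=10, t_max=100.0, seed=0):
    """Hypothetical hybrid loop: a local CG descent phase, then a global
    P-algorithm search along a long random line; the best point is kept."""
    rng = np.random.default_rng(seed)
    w_best = np.asarray(w0, dtype=float)
    f_best = loss(w_best)
    for _ in range(n_rounds):
        # Local phase (scipy CG as a stand-in for scaled conjugate gradient).
        w = minimize(loss, w_best, method="CG").x
        # Global phase: random direction, long line, P-algorithm search.
        d = rng.standard_normal(w.shape)
        d /= np.linalg.norm(d)
        t, _ = global_line_search(lambda s: loss(w + s * d), t_max)
        w = w + t * d
        if loss(w) < f_best:
            w_best, f_best = w.copy(), loss(w)
    return w_best
```

This loop and its parameters are our own guess at the structure; the report's actual rule for combining "the best points of each" technique is not visible in the truncated abstract.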

Overcoming the Local-Minimum Problem in Training Multilayer Perceptrons with the NRAE-MSE Training Method

A method of training multilayer perceptrons (MLPs) to reach a global or nearly global minimum of the standard mean squared error (MSE) criterion is proposed. It has been found that the region in the weight space that does not have a local minimum of the normalized risk-averting error (NRAE) criterion expands strictly to the entire weight space as the risk-sensitivity index increases to infinity...
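The abstract does not reproduce the criterion itself. In the NRAE literature, the normalized risk-averting error over m residuals is usually written, up to notation, as

$$ C_\lambda(w) \;=\; \frac{1}{\lambda}\,\ln\!\left(\frac{1}{m}\sum_{i=1}^{m} \exp\big(\lambda\,\|y_i - \hat y_i(w)\|^2\big)\right), $$

which tends to the MSE as $\lambda \to 0$ and weights large residuals ever more heavily as the risk-sensitivity index $\lambda$ grows. Take this exact form as our reading of the literature rather than as the paper's own statement.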

Convex Kernel Underestimation of Functions with Multiple Local Minima

A function on R with multiple local minima is approximated from below, via linear programming, by a linear combination of convex kernel functions using sample points from the given function. The resulting convex kernel underestimator is then minimized, using either a linear equation solver for a linear-quadratic kernel or a Newton method for a Gaussian kernel, to obtain an approximation to a...
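Schematically, and with the caveat that the abstract is truncated, the construction it describes fits a linear program over kernel coefficients $\alpha_j \ge 0$ (the nonnegativity is our assumption, made so that the combination of convex kernels stays convex):

$$ \max_{\alpha \ge 0}\ \sum_{i=1}^{m} u(x_i) \quad \text{s.t.}\quad u(x_i) \le f(x_i),\ i = 1,\dots,m, \qquad u(x) = \sum_{j} \alpha_j\, k(x, x_j). $$

Maximizing the underestimator's values at the samples tightens it from below; its minimizer, found by the linear solve or Newton step the abstract mentions, then serves as the approximation sought.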

Performance Comparison between Back Propagation, RPE and MRPE Algorithms for Training MLP Networks

This paper presents a performance comparison between back propagation, recursive prediction error (RPE) and modified recursive prediction error (MRPE) algorithms for training multilayered perceptron networks. Back propagation is a steepest-descent-type algorithm that typically has a slow convergence rate, and its search for the global minimum often becomes trapped at poor local minima. RPE and MR...
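For concreteness, the steepest-descent update the abstract attributes to back propagation is, in its plain form,

$$ w_{t+1} \;=\; w_t \;-\; \eta\, \nabla_w E(w_t), $$

with a fixed learning rate $\eta$. RPE-type methods instead build a recursively updated, approximately second-order gain in place of the fixed $\eta$, which is the usual explanation for the faster convergence this comparison reports.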

Finding Consistent Global Checkpoints in a Distributed Computation

Finding consistent global checkpoints of a distributed computation is important for analyzing, testing, or verifying properties of these computations. In this paper we present a theoretical foundation for finding consistent global checkpoints. Given an arbitrary set S of local checkpoints, we prove exactly which sets of other local checkpoints can be combined with S to build consistent global che...
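The abstract breaks off before the characterization (in this literature it is given by Netzer and Xu's zigzag paths, which strictly generalize causal paths). For the complete case of one checkpoint per process, the standard condition is that no member may causally precede another, which vector clocks make mechanical to check; the helper names in this sketch are our own:

```python
def happened_before(vc_a, vc_b):
    """Lamport's happened-before via vector clocks: a -> b iff VC(a) <= VC(b)
    componentwise, with at least one strict inequality."""
    return all(x <= y for x, y in zip(vc_a, vc_b)) and vc_a != vc_b

def is_consistent(checkpoints):
    """A complete global checkpoint (one vector clock per process) is
    consistent iff no member happened-before another.  For *extending* a
    partial set S, the paper's zigzag-path condition is strictly stronger."""
    return not any(
        happened_before(a, b) or happened_before(b, a)
        for i, a in enumerate(checkpoints)
        for b in checkpoints[i + 1:]
    )

# Two processes: concurrent checkpoints form a consistent global state...
assert is_consistent([[1, 0], [0, 1]])
# ...but a causally ordered pair does not.
assert not is_consistent([[1, 0], [2, 1]])
```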



Publication year: 1997